
    BodySpace: inferring body pose for natural control of a music player

    We describe the BodySpace system, which uses inertial sensing and pattern recognition to allow the gestural control of a music player by placing the device at different parts of the body. We demonstrate a new approach to the segmentation and recognition of gestures for this kind of application and show how simulated physical model-based techniques can shape gestural interaction.
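    The abstract describes recognising where on the body the device has been placed from inertial data. A minimal sketch of one way this could work, assuming placement can be distinguished by device orientation relative to gravity (the body locations and template vectors below are illustrative, not from the paper):

    ```python
    import numpy as np

    # Hypothetical reference orientations (gravity direction in device
    # coordinates) for each body location, as might be recorded during
    # a per-user calibration phase.
    TEMPLATES = {
        "ear":   np.array([0.0, 0.97, 0.26]),
        "hip":   np.array([0.71, 0.0, 0.71]),
        "wrist": np.array([0.0, 0.0, 1.0]),
    }

    def classify_placement(accel_sample):
        """Return the body location whose template gravity vector is
        closest (by cosine similarity) to the sensed acceleration."""
        a = np.asarray(accel_sample, dtype=float)
        a = a / np.linalg.norm(a)
        scores = {name: float(a @ (t / np.linalg.norm(t)))
                  for name, t in TEMPLATES.items()}
        return max(scores, key=scores.get)
    ```

    A real system would also need the segmentation step the abstract mentions, i.e. detecting when the device has come to rest at a location before classifying.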

    GpsTunes: controlling navigation via audio feedback

    We combine the functionality of a mobile Global Positioning System (GPS) with that of an MP3 player, implemented on a PocketPC, to produce a handheld system capable of guiding a user to their desired target location via continuously adapted music feedback. We illustrate how the approach to presentation of the audio display can benefit from insights from control theory, such as predictive 'browsing' elements to the display, and the appropriate representation of uncertainty or ambiguity in the display. The probabilistic interpretation of the navigation task can be generalised to other context-dependent mobile applications. This is the first example of a completely handheld location-aware music player. We discuss scenarios for use of such systems.
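    Guiding a user by adapting the music implies continuously computing the bearing from the user to the target and steering some audio parameter with it. A minimal sketch under that assumption (the pan mapping and its 90-degree saturation point are illustrative choices, not taken from the paper):

    ```python
    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial great-circle bearing from point 1 to point 2, in degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = (math.cos(phi1) * math.sin(phi2)
             - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
        return math.degrees(math.atan2(y, x)) % 360.0

    def stereo_pan(heading_deg, target_bearing_deg):
        """Map the signed angle between heading and target bearing to a
        stereo pan in [-1, 1] (-1 = hard left, +1 = hard right), so the
        music appears to come from the direction of the target."""
        diff = (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
        return max(-1.0, min(1.0, diff / 90.0))
    ```

    The paper's control-theoretic treatment would additionally blend in predictive and uncertainty cues; this sketch covers only the basic direction-to-audio mapping.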

    A proton-cyclotron wave storm generated by unstable proton distribution functions in the solar wind

    We use audification of 0.092 s cadence magnetometer data from the Wind spacecraft to identify waves with amplitudes >0.1 nT near the ion gyrofrequency (~0.1 Hz) with duration longer than 1 hr during 2008. We present one of the most common types of event for a case study and find it to be a proton-cyclotron wave storm, coinciding with highly radial magnetic field and a suprathermal proton beam close in density to the core distribution itself. Using linear Vlasov analysis, we conclude that the long-duration, large-amplitude waves are generated by the instability of the proton distribution function. The origin of the beam is unknown, but the radial field period is found in the trailing edge of a fast solar wind stream and resembles other events thought to be caused by magnetic field footpoint motion or interchange reconnection between coronal holes and closed field lines in the corona.
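    Audification replays a data series directly as sound: each magnetometer sample becomes one audio sample, so playback at an audio rate speeds the signal up by the ratio of the two sampling rates. A back-of-the-envelope check, using the 0.092 s cadence from the abstract (the 44.1 kHz playback rate is an assumption, not stated in the paper):

    ```python
    cadence_s = 0.092                # Wind magnetometer sample spacing (from abstract)
    f_data = 1.0 / cadence_s        # ~10.9 Hz native sampling rate
    f_audio = 44100.0               # assumed standard audio playback rate

    speedup = f_audio / f_data      # each data Hz is multiplied by this factor
    wave_freq = 0.1                 # Hz, near the ion gyrofrequency (from abstract)
    audible = wave_freq * speedup   # ~406 Hz: the waves land in the audible range

    one_hour_samples = 3600.0 / cadence_s      # samples in 1 hr of data
    playback_s = one_hour_samples / f_audio    # under 1 s of audio per hour of data
    ```

    This compression is what makes audification attractive for the survey described: an hour-long wave storm is audible as a sub-second tone, so long stretches of 2008 data can be screened by ear quickly.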

    A Reactive Environment for Dynamic Volume Control

    El-Shimy D, Hermann T, Cooperstock J. A Reactive Environment for Dynamic Volume Control. In: Essl GE, Gillespie B, Gurevich M, O'Modhrain S, eds. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME). Ann Arbor, Michigan: University of Michigan; 2012. In this paper, we discuss the design and testing of a reactive environment for musical performance. Driven by the interpersonal interactions amongst musicians, our system gives users, i.e., several musicians playing together in a band, real-time control over certain aspects of their performance, enabling them to change volume levels dynamically simply by moving around. It differs most notably from the majority of ventures into the design of novel musical interfaces and installations in its multidisciplinary approach, drawing on techniques from Human-Computer Interaction, social sciences and ludology. Our User-Centered Design methodology was central to producing an interactive environment that enhances traditional performance with novel functionalities. During a formal experiment, musicians reported finding our system exciting and enjoyable. We also introduce some additional interactions that can further enhance the interactivity of our reactive environment. In describing the particular challenges of working with such a unique and creative user as the musician, we hope that our approach can be of guidance to interface developers working on applications of a creative nature.
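    The core mapping described, changing volume levels by moving around, amounts to a function from inter-musician distance to gain. A minimal sketch of one such mapping, assuming closer means louder (the near/far thresholds and the linear ramp are illustrative, not the paper's actual mapping):

    ```python
    def distance_to_gain(distance_m, near=0.5, far=4.0):
        """Map the distance between two musicians to a gain in [0, 1]:
        full level inside `near`, silence beyond `far`, and a linear
        ramp in between. Thresholds are illustrative assumptions."""
        if distance_m <= near:
            return 1.0
        if distance_m >= far:
            return 0.0
        return (far - distance_m) / (far - near)
    ```

    In a running system this gain would be recomputed each frame from tracked positions and applied per musician, so stepping toward a bandmate smoothly raises their level.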

    Gesture signature for ambient intelligence applications: a feasibility study

    No full text
    This work investigates the feasibility of a personal verification system using gestures as biometric signatures. Gestures are captured by low-power, low-cost tri-axial accelerometers integrated into an expansion pack for palmtop computers. The objective of our study is to understand whether the mobile system can recognize its owner by how she/he performs a particular gesture, acting as a gesture signature. The signature can be used for obtaining access to the mobile device, but the handheld device can also act as an intelligent key to provide access to services in an ambient intelligence scenario. Sample gestures are analyzed and classified using supervised and unsupervised dimensionality reduction techniques. Results on a set of benchmark gestures performed by several individuals are encouraging.
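    One way to combine the unsupervised dimensionality reduction mentioned here with verification is to project gesture feature vectors with PCA and accept a new sample only if it lands near the owner's enrolled templates. A minimal sketch under that assumption (the feature layout, distance rule and threshold are illustrative, not the study's actual pipeline):

    ```python
    import numpy as np

    def pca_project(X, k):
        """Project the rows of X (one gesture feature vector per row)
        onto the top-k principal components via SVD of the centred data."""
        mean = X.mean(axis=0)
        Xc = X - mean
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        components = Vt[:k]
        return Xc @ components.T, mean, components

    def verify(sample, owner_proj, mean, components, threshold=0.5):
        """Accept the gesture if its PCA projection lies within
        `threshold` of the owner's nearest enrolled template.
        The threshold value is an illustrative assumption."""
        z = (np.asarray(sample, dtype=float) - mean) @ components.T
        dists = np.linalg.norm(owner_proj - z, axis=1)
        return float(dists.min()) <= threshold
    ```

    The supervised variant the abstract also mentions would instead learn the projection from labelled owner/impostor gestures (e.g. LDA), but the enrol-then-threshold structure stays the same.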
